Improving Needle Penetration via Precise Rotational Insertion Using Iterative Learning Control
Foroutani, Yasamin, Mousavi-Motlagh, Yasamin, Barzelay, Aya, Tsao, Tsu-Chin
Abstract--Achieving precise control of robotic tool paths is often challenged by inherent system misalignments, unmodeled dynamics, and actuation inaccuracies. This work introduces an Iterative Learning Control (ILC) strategy to enable precise rotational insertion of a tool during robotic surgery, improving penetration efficacy and safety compared to straight insertion, as tested in subretinal injection. A 4 degree-of-freedom (DOF) robot manipulator is used, where misalignment of the fourth joint complicates the direct application of needle rotation, motivating an ILC approach that iteratively adjusts joint commands based on positional feedback. The process begins with calibrating the forward kinematics for the chosen surgical tool to achieve higher accuracy, followed by successive ILC iterations guided by Optical Coherence Tomography (OCT) volume scans to measure the error and refine the control inputs. Experimental results on subretinal injection tasks in ex vivo pig eyes show that the optimized trajectory yielded higher success rates in tissue penetration and subretinal injection than straight insertion, demonstrating the effectiveness of ILC in overcoming misalignment challenges. This approach also offers potential applications for other high-precision robotic tasks requiring controlled insertions.

Accurate and precise control of movement is fundamental to many scientific fields [1], but it becomes even more critical in surgical applications, where even minor deviations can significantly impact outcomes. Surgical procedures often demand sub-millimeter accuracy, especially in areas involving delicate tissues and confined spaces, such as ophthalmology. However, consistently achieving this level of precision can be challenging due to the inherent limitations of human motor skills, such as involuntary tremors and fatigue [2]. These limitations are amplified in intraocular microsurgery, which requires not only steady hands but also enhanced sensory feedback and hand-eye coordination.
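The iterative adjustment of joint commands from positional feedback described in the abstract can be sketched as a P-type ILC update, u_{k+1} = u_k + L * e_k. The learning gain, the toy one-dimensional plant, and its constant misalignment offset below are hypothetical illustrations, not values from the paper:

```python
import numpy as np

def ilc_update(u, error, learning_gain=0.5):
    """One P-type ILC iteration: u_{k+1} = u_k + L * e_k."""
    return u + learning_gain * error

def plant(u, offset=0.3):
    """Toy plant: the commanded position plus an unknown constant
    misalignment offset (hypothetical stand-in for joint misalignment)."""
    return u + offset

reference = np.linspace(0.0, 1.0, 5)  # desired tip trajectory
u = reference.copy()                   # initial joint command
for _ in range(20):
    y = plant(u)                       # execute and measure (e.g., via OCT)
    e = reference - y                  # positional error along the trajectory
    u = ilc_update(u, e)               # refine the command for the next trial
max_err = np.max(np.abs(reference - plant(u)))
```

Because the update contracts the error by the learning gain each trial, the learned command converges to the reference shifted by the (never explicitly modeled) offset.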
Towards Motion Compensation in Autonomous Robotic Subretinal Injections
Arikan, Demir, Zhang, Peiyao, Sommersperger, Michael, Dehghani, Shervin, Esfandiari, Mojtaba, Taylor, Russel H., Nasseri, M. Ali, Gehlbach, Peter, Navab, Nassir, Iordachita, Iulian
Exudative (wet) age-related macular degeneration (AMD) is a leading cause of vision loss in older adults, typically treated with intravitreal injections. Emerging therapies, such as subretinal injections of stem cells, gene therapy, small molecules, or RPE cells, require precise delivery to avoid damaging delicate retinal structures. Autonomous robotic systems can potentially offer the necessary precision for these procedures. This paper presents a novel approach for motion compensation in robotic subretinal injections, utilizing real-time Optical Coherence Tomography (OCT). The proposed method leverages B5-scans, a rapid acquisition of small-volume OCT data, for dynamic tracking of retinal motion along the Z-axis, compensating for physiological movements such as breathing and heartbeat. Validation experiments on ex vivo porcine eyes revealed challenges in maintaining a consistent tool-to-retina distance, with deviations of up to 200 µm for 100 µm amplitude motions and over 80 µm for 25 µm amplitude motions over one minute. Subretinal injections faced additional difficulties, with horizontal shifts causing the needle to move off-target and inject into the vitreous. These results highlight the need for improved motion prediction and horizontal stability to enhance the accuracy and safety of robotic subretinal procedures.
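A minimal sketch of the Z-axis tracking problem described above: a controller that can only act on the most recently completed scan lags the physiological motion by one acquisition step, leaving a residual deviation even with perfect measurements. The sinusoidal motion model, target gap, and one-step latency are assumptions for illustration, not the paper's method:

```python
import math

TARGET_GAP = 0.1  # desired tool-to-retina distance in mm (hypothetical)

def retina_depth(k):
    """Breathing-like axial retinal motion, 0.1 mm amplitude (simulated)."""
    return 0.1 * math.sin(2 * math.pi * k / 50)

# The controller sees only the last completed scan, so each command is
# based on a stale measurement -- a one-step acquisition latency.
prev_meas = retina_depth(0)
deviations = []
for k in range(1, 100):
    tool_z = prev_meas + TARGET_GAP           # command from stale measurement
    actual = retina_depth(k)                  # retina has already moved
    deviations.append(abs((tool_z - actual) - TARGET_GAP))
    prev_meas = actual                        # new scan arrives
max_dev = max(deviations)
```

Even this idealized loop accumulates a deviation proportional to the motion's rate of change, which is why the abstract points to motion *prediction*, not just faster tracking, as the needed improvement.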
Real-time Deformation-aware Control for Autonomous Robotic Subretinal Injection under iOCT Guidance
Arikan, Demir, Zhang, Peiyao, Sommersperger, Michael, Dehghani, Shervin, Esfandiari, Mojtaba, Taylor, Russel H., Nasseri, M. Ali, Gehlbach, Peter, Navab, Nassir, Iordachita, Iulian
Robotic platforms provide repeatable and precise tool positioning that significantly enhances retinal microsurgery. Integration of such systems with intraoperative optical coherence tomography (iOCT) enables image-guided robotic interventions, allowing advanced treatments, such as injection of therapeutic agents into the subretinal space, to be performed autonomously. Yet, tissue deformations due to tool-tissue interactions are a major challenge in autonomous iOCT-guided robotic subretinal injection, impacting correct needle positioning and, thus, the outcome of the procedure. This paper presents a novel method for autonomous subretinal injection under iOCT guidance that considers tissue deformations during the insertion procedure. This is achieved through real-time segmentation and 3D reconstruction of the surgical scene from densely sampled iOCT B-scans, which we refer to as B5-scans, to monitor the positioning of the instrument relative to a virtual target layer defined at a relative position between the ILM and RPE. Our experiments on ex vivo porcine eyes demonstrate dynamic adjustment of the insertion depth and overall improved accuracy in needle positioning compared to previous autonomous insertion approaches. Compared to a 35% success rate in subretinal bleb generation with previous approaches, our proposed method reliably and robustly created subretinal blebs in all our experiments.
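The virtual target layer defined at a relative position between the ILM and RPE amounts to a per-A-scan linear interpolation between the two segmented boundaries. The depth values and the relative position `alpha` below are hypothetical, chosen only to illustrate the geometry:

```python
def virtual_target_depth(z_ilm, z_rpe, alpha):
    """Depth of a virtual target layer at relative position alpha between
    the ILM (alpha = 0) and the RPE (alpha = 1) along one A-scan."""
    if not 0.0 <= alpha <= 1.0:
        raise ValueError("alpha must lie in [0, 1]")
    return z_ilm + alpha * (z_rpe - z_ilm)

# Hypothetical A-scan depths in micrometers; alpha near 1 places the
# target just above the RPE, where subretinal blebs are formed.
z_target = virtual_target_depth(z_ilm=120.0, z_rpe=320.0, alpha=0.9)
```

Because the target is defined relative to the segmented boundaries rather than as a fixed depth, it moves with the tissue as it deforms, which is the property the deformation-aware controller exploits.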
EyeLS: Shadow-Guided Instrument Landing System for Intraocular Target Approaching in Robotic Eye Surgery
Yang, Junjie, Zhao, Zhihao, Shen, Siyuan, Zapp, Daniel, Maier, Mathias, Huang, Kai, Navab, Nassir, Nasseri, M. Ali
Robotic ophthalmic surgery is an emerging technology that facilitates high-precision interventions, such as retinal penetration in subretinal injection and removal of floating tissues in retinal detachment, guided by imaging modalities such as microscopy and intraoperative OCT (iOCT). Although iOCT has been explored for locating the needle tip within its range-limited ROI, it is still difficult to coordinate iOCT's motion with the needle, especially at the initial target-approaching stage. Meanwhile, due to 2D perspective projection and the resulting loss of depth information, current image-based methods cannot effectively estimate the needle tip's trajectory towards either retinal or floating targets. To address this limitation, we propose to use the shadow positions of the target and the instrument tip to estimate their relative depth and accordingly optimize the instrument tip's insertion trajectory until the tip approaches targets within iOCT's scanning area. Our method successfully approaches targets on a retina model and achieves average depth errors of 0.0127 mm and 0.3473 mm for floating and retinal targets, respectively, in the surgical simulator without damaging the retina.
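The geometric intuition behind shadow guidance is that an instrument tip and its cast shadow converge in the image as the tip approaches the surface, coinciding at contact, so their 2D separation is a proxy for height. A minimal sketch, where the pixel coordinates and the pixel-to-mm scale are hypothetical and would in practice depend on the light-source geometry and calibration:

```python
def relative_height(tip_xy, shadow_xy, scale_mm_per_px=1.0):
    """Estimate the tip's height above the surface from the 2D distance
    between tip and shadow (they coincide at contact). The scale factor
    is a hypothetical calibration tied to the light geometry."""
    dx = tip_xy[0] - shadow_xy[0]
    dy = tip_xy[1] - shadow_xy[1]
    return scale_mm_per_px * (dx * dx + dy * dy) ** 0.5

h_far = relative_height((10.0, 5.0), (16.0, 13.0))   # well above the surface
h_near = relative_height((10.0, 5.0), (10.3, 5.4))   # almost touching
```

A trajectory optimizer can then drive this separation toward zero over the target, recovering a depth cue that the 2D microscope view alone does not provide.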
Steady-Hand Eye Robot 3.0: Optimization and Benchtop Evaluation for Subretinal Injection
Alamdar, Alireza, Usevitch, David E., Wu, Jiahao, Taylor, Russell H., Gehlbach, Peter, Iordachita, Iulian
Subretinal injection methods and other procedures for treating retinal conditions and diseases (many considered incurable) have been limited in scope due to limited human motor control. This study demonstrates the next-generation, cooperatively controlled Steady-Hand Eye Robot (SHER 3.0), a precise and intuitive-to-use robotic platform achieving clinical standards for targeting accuracy and resolution for subretinal injections. The system design and basic kinematics are reported, and a deflection model for the incorporated delta stage and validation experiments are presented. The model is used to optimize the delta stage parameters, maximizing the global conditioning index and minimizing torsional compliance. Five tests measuring accuracy, repeatability, and deflection show the optimized stage design achieves a tip accuracy of <30 µm, tip repeatability of 9.3 µm and 0.02°, and deflections between 20-350 µm/N. Future work will use updated control models to refine tip positioning outcomes and will be tested on in vivo animal models.
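A linear deflection model of the kind characterized above (deflection proportional to applied force, with compliance in the reported 20-350 µm/N range) naturally suggests feedforward compensation: offset the commanded position by the predicted deflection. The compliance value, force, and target below are hypothetical, and this is a sketch of the general idea rather than the paper's control model:

```python
def compensated_command(target_tip_um, force_n, compliance_um_per_n=200.0):
    """Offset the commanded tip position (um) by the deflection predicted
    from a linear compliance model under the measured tool force (N).
    The compliance value is a hypothetical pick inside 20-350 um/N."""
    predicted_deflection = compliance_um_per_n * force_n
    return target_tip_um + predicted_deflection

# Hypothetical example: 1000 um target under a 0.05 N tool load.
cmd = compensated_command(target_tip_um=1000.0, force_n=0.05)
```

The accuracy of such compensation is bounded by how well the fitted compliance matches the true stage behavior, which is why the study's deflection characterization matters for control.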
Robotic Navigation Autonomy for Subretinal Injection via Intelligent Real-Time Virtual iOCT Volume Slicing
Dehghani, Shervin, Sommersperger, Michael, Zhang, Peiyao, Martin-Gomez, Alejandro, Busam, Benjamin, Gehlbach, Peter, Navab, Nassir, Nasseri, M. Ali, Iordachita, Iulian
In the last decade, various robotic platforms have been introduced that can support delicate retinal surgeries. Concurrently, to provide semantic understanding of the surgical area, recent advances have enabled microscope-integrated intraoperative Optical Coherence Tomography (iOCT) with high-resolution 3D imaging at near video rate. The combination of robotics and semantic understanding enables task autonomy in robotic retinal surgery, such as for subretinal injection. This procedure requires precise needle insertion for best treatment outcomes. However, merging robotic systems with iOCT introduces new challenges. These include, but are not limited to, high demands on data processing rates and dynamic registration of these systems during the procedure. In this work, we propose a framework for autonomous robotic navigation for subretinal injection, based on intelligent real-time processing of iOCT volumes. Our method consists of an instrument pose estimation method, an online registration between the robotic and the iOCT system, and trajectory planning tailored for navigation to an injection target. We also introduce intelligent virtual B-scans, a volume slicing approach for rapid instrument pose estimation, enabled by Convolutional Neural Networks (CNNs). Our experiments on ex vivo porcine eyes demonstrate the precision and repeatability of the method. Finally, we discuss identified challenges in this work and suggest potential solutions to further the development of such systems.
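The core of a virtual B-scan is resampling the acquired volume along an arbitrary en-face line, e.g. one aligned with the instrument axis, instead of being restricted to the native scan grid. A minimal sketch with nearest-neighbor sampling; the array layout, function name, and synthetic volume are assumptions, and the CNN-driven selection of where to slice is not reproduced here:

```python
import numpy as np

def virtual_bscan(volume, p0, p1, n_samples=64):
    """Extract a virtual B-scan: the set of A-scan columns along the
    en-face line p0 -> p1, sampled nearest-neighbor. volume has shape
    (z, y, x); the result has shape (z, n_samples)."""
    zs, ys, xs = volume.shape
    t = np.linspace(0.0, 1.0, n_samples)
    x = np.clip(np.rint(p0[0] + t * (p1[0] - p0[0])).astype(int), 0, xs - 1)
    y = np.clip(np.rint(p0[1] + t * (p1[1] - p0[1])).astype(int), 0, ys - 1)
    return volume[:, y, x]

# Tiny synthetic volume (4 depths, 5x6 en-face grid) for illustration.
vol = np.arange(4 * 5 * 6).reshape(4, 5, 6).astype(float)
scan = virtual_bscan(vol, p0=(0, 0), p1=(5, 4), n_samples=8)
```

Because the slice is computed on demand from the existing volume, the plane can follow the estimated instrument pose at each step without re-steering the scanner, which is what makes the approach fast enough for real-time pose estimation.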